A Boosting method in Combination with Decision Trees

Authors

  • Kristína Machová
  • Miroslav Puszta
  • Peter Bednár
Abstract

This paper describes boosting, a method that can improve the results of classification algorithms. The method is applied here to classification algorithms that generate decision trees. A modification of the AdaBoost algorithm was implemented. Results of performance tests focused on the use of the boosting method with binary decision trees are presented. The minimum number of decision trees that enables an improvement over the classification of the base machine learning algorithm was determined. The tests were carried out on the Reuters-21578 document collection as well as on documents from the internet portal of TV Markíza.
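The page itself contains no source code. As a rough illustration of the general idea only, and not the authors' modification of AdaBoost, the following Python sketch shows a minimal discrete AdaBoost loop that uses shallow decision trees (scikit-learn's DecisionTreeClassifier) as base learners; the function names adaboost_fit/adaboost_predict, the depth-1 default, and the {-1, +1} label coding are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50, max_depth=1):
    """Train a committee of weighted decision trees (discrete AdaBoost sketch).

    Assumes binary labels coded as -1 and +1.
    """
    X, y = np.asarray(X), np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                        # uniform initial example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        tree = DecisionTreeClassifier(max_depth=max_depth)
        tree.fit(X, y, sample_weight=w)            # weighted base learner
        pred = tree.predict(X)
        err = np.sum(w * (pred != y))              # weighted training error (w sums to 1)
        if err >= 0.5:                             # no better than chance: stop early
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)             # up-weight misclassified examples
        w /= w.sum()
        learners.append(tree)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    # Weighted vote of the base trees; the sign gives the predicted {-1, +1} class.
    score = sum(a * t.predict(X) for a, t in zip(alphas, learners))
    return np.sign(score)
```

The number of rounds (n_rounds) corresponds to the number of decision trees in the committee, which is the quantity whose minimum useful value the paper investigates.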


Similar articles

On the Boosting Pruning Problem (short Submission)

Boosting is a powerful method for improving the predictive accuracy of classifiers. The AdaBoost algorithm of Freund and Schapire has been successfully applied to many domains [2, 10, 12], and the combination of AdaBoost with the C4.5 decision tree algorithm has been called the best off-the-shelf learning algorithm in practice. Unfortunately, in some applications, the number of decision trees requi...


Stochastic Attribute Selection Committees withMultiple Boosting : Learning More

Classifier learning is a key technique for KDD. Approaches to learning classifier committees, including Boosting, Bagging, Sasc, and SascB, have demonstrated great success in increasing the prediction accuracy of decision trees. Boosting and Bagging create different classifiers by modifying the distribution of the training set. Sasc adopts a different method. It generates committees by stochastic ma...
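For contrast with the boosting sketch above, the following hedged Python sketch illustrates the bagging side of the committee methods this abstract mentions: each committee member is trained on a bootstrap resample of the training set rather than on a reweighted one. It is a generic illustration, not the Sasc/SascB methods of the cited paper; the function names and the {-1, +1} label coding are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_trees=25, seed=0):
    """Train a committee of decision trees on bootstrap resamples (bagging sketch)."""
    X, y = np.asarray(X), np.asarray(y)
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), size=len(y))   # sample with replacement
        trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return trees

def bagging_predict(X, trees):
    # Majority vote over the committee (labels assumed to be coded as -1 and +1).
    votes = np.sum([t.predict(X) for t in trees], axis=0)
    return np.sign(votes)
```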


Boosting bonsai trees for efficient features combination: application to speaker role identification

In this article, we tackle the problem of speaker role detection from broadcast news shows. In the literature, many proposed solutions are based on the combination of various features coming from acoustic, lexical and semantic information with a machine learning algorithm. Many previous studies mention the use of boosting over decision stumps to combine these features efficiently. In this work,...


On the Boosting Pruning Problem

Boosting is a powerful method for improving the predictive accuracy of classifiers. The AdaBoost algorithm of Freund and Schapire has been successfully applied to many domains [2, 10, 12], and the combination of AdaBoost with the C4.5 decision tree algorithm has been called the best off-the-shelf learning algorithm in practice. Unfortunately, in some applications, the number of decision trees req...


Boosting recombined weak classifiers

Boosting is a set of methods for the construction of classifier ensembles. The distinguishing feature of these methods is that they make it possible to obtain a strong classifier from the combination of weak classifiers. Therefore, it is possible to use boosting methods with very simple base classifiers. Among the simplest classifiers are decision stumps, decision trees with only one decision node. This...
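Since this abstract defines decision stumps as decision trees with a single decision node, the following minimal sketch (illustrative only, using scikit-learn and made-up toy data) shows what such a stump looks like in practice: a depth-1 tree that tests one threshold on one attribute.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy data: a single numeric attribute, labels coded as -1 and +1 (assumed values).
X = np.array([[0.10], [0.35], [0.40], [0.70], [0.80], [0.90]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A depth-1 tree is a decision stump: one threshold test on one attribute.
stump = DecisionTreeClassifier(max_depth=1).fit(X, y)
print(stump.predict([[0.30], [0.85]]))   # expected: [-1  1]
```

A stump like this can also serve as the weak learner in the AdaBoost sketch given after the main abstract above, by keeping max_depth=1.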



Journal:

Volume   Issue

Pages  -

Publication date: 2004